
    Analysis of Linsker's simulations of Hebbian rules

    Linsker has reported the development of center-surround receptive fields and oriented receptive fields in simulations of a Hebb-type equation in a linear network. The dynamics of the learning rule are analyzed in terms of the eigenvectors of the covariance matrix of cell activities. Analytic and computational results for Linsker's covariance matrices, and some general theorems, lead to an explanation of the emergence of center-surround and certain oriented structures. We estimate criteria for the parameter regime in which center-surround structures emerge.
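    The eigenvector analysis described in this abstract can be illustrated with a minimal numerical sketch. Under a linear Hebbian rule dw/dt = Cw, the weight vector aligns with the principal eigenvector of the activity covariance matrix C. The toy covariance matrix and Euler integration below are our own assumptions for illustration, not Linsker's actual arbor-function matrices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy covariance matrix of input activities (hypothetical 5-input cell).
A = rng.normal(size=(5, 5))
C = A @ A.T  # symmetric positive semi-definite, like a covariance matrix

# Linear Hebbian dynamics dw/dt = C w, integrated with Euler steps.
w = rng.normal(size=5)
dt = 0.01
for _ in range(2000):
    w = w + dt * (C @ w)
    w = w / np.linalg.norm(w)  # renormalize only to avoid numerical overflow

# The weight vector converges to the principal eigenvector of C.
eigvals, eigvecs = np.linalg.eigh(C)
principal = eigvecs[:, -1]           # eigenvector of the largest eigenvalue
alignment = abs(w @ principal)       # |cos(angle)|, approaches 1
```

The growth rate along each eigenvector is its eigenvalue, so the mode with the largest eigenvalue dominates exponentially fast; which receptive-field structure emerges then depends on which eigenvector of the covariance matrix is principal.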

    The Role of Constraints in Hebbian Learning

    Models of unsupervised, correlation-based (Hebbian) synaptic plasticity are typically unstable: either all synapses grow until each reaches the maximum allowed strength, or all synapses decay to zero strength. A common method of avoiding these outcomes is to use a constraint that conserves or limits the total synaptic strength over a cell. We study the dynamic effects of such constraints. Two methods of enforcing a constraint are distinguished, multiplicative and subtractive. For otherwise linear learning rules, multiplicative enforcement of a constraint results in dynamics that converge to the principal eigenvector of the operator determining unconstrained synaptic development. Subtractive enforcement, in contrast, typically leads to a final state in which almost all synaptic strengths reach either the maximum or minimum allowed value. This final state is often dominated by weight configurations other than the principal eigenvector of the unconstrained operator. Multiplicative enforcement yields a “graded” receptive field in which most mutually correlated inputs are represented, whereas subtractive enforcement yields a receptive field that is “sharpened” to a subset of maximally correlated inputs. If two equivalent input populations (e.g., two eyes) innervate a common target, multiplicative enforcement prevents their segregation (ocular dominance segregation) when the two populations are weakly correlated, whereas subtractive enforcement allows segregation under these circumstances. These results may be used to understand constraints both over output cells and over input cells. A variety of rules that can implement constrained dynamics are discussed.
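    The multiplicative-versus-subtractive distinction can be demonstrated numerically. In this sketch (toy correlation matrix, step sizes, and bounds are our assumptions) the same linear Hebbian growth is run twice: multiplicative enforcement rescales all weights to conserve their sum and yields a graded final state, while subtractive enforcement removes the same amount from every synapse and drives weights to the bounds:

```python
import numpy as np

rng = np.random.default_rng(1)
n, dt, steps, wmax = 8, 0.005, 4000, 1.0

# Toy correlation matrix: all inputs mutually positively correlated.
C = 0.2 * np.ones((n, n)) + 0.8 * np.eye(n)

def run(mode):
    w = rng.uniform(0.4, 0.6, size=n)
    total = w.sum()
    for _ in range(steps):
        dw = C @ w
        if mode == "subtractive":
            dw = dw - dw.mean()        # subtract equally from every synapse
        w = np.clip(w + dt * dw, 0.0, wmax)
        if mode == "multiplicative":
            w = w * (total / w.sum())  # rescale to conserve total strength
    return w

w_mult = run("multiplicative")  # graded: weights settle near total/n
w_sub = run("subtractive")      # sharpened: weights saturate at 0 or wmax
```

Here the principal eigenvector of C is the uniform vector, so multiplicative enforcement converges to nearly equal weights, while subtractive enforcement amplifies deviations from the mean until every weight hits a bound, matching the "graded" versus "sharpened" outcomes described above.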

    Security and confidentiality approach for the Clinical E-Science Framework (CLEF)

    Objectives: CLEF is an MRC-sponsored project in the E-Science programme that aims to establish methodologies and a technical infrastructure for the next generation of integrated clinical and bioscience research. Methods: The heart of the CLEF approach to this challenge is to design and develop a pseudonymised repository of histories of cancer patients that can be accessed by researchers. Robust mechanisms and policies have been developed to ensure that patient privacy and confidentiality are preserved while delivering a repository of such medically rich information for the purposes of scientific research. Results: This paper summarises the overall approach adopted by CLEF to meet data protection requirements, including the data flows, pseudonymisation measures and additional monitoring policies that are currently being developed. Conclusion: Once evaluated, it is hoped that the CLEF approach can serve as a model for other distributed electronic health record repositories to be accessed for research.
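    As background to the pseudonymisation idea, a common basic mechanism is keyed hashing: identifiers are replaced by stable, non-reversible pseudonyms so records can still be linked without exposing identity. The sketch below is a generic illustration of that mechanism only; the key name, record fields, and design are our assumptions, not CLEF's actual architecture, which the paper describes:

```python
import hmac
import hashlib

# Hypothetical secret key; in practice held by a trusted party, never
# stored alongside the pseudonymised data.
SECRET_KEY = b"held-by-trusted-third-party-only"

def pseudonymise(patient_id: str) -> str:
    """Derive a stable, non-reversible pseudonym from an identifier."""
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

record = {"patient_id": "NHS-1234567", "diagnosis": "C34.9", "age_band": "60-69"}
safe_record = {**record, "patient_id": pseudonymise(record["patient_id"])}

# The same input always yields the same pseudonym, so records belonging to
# one patient remain linkable inside the repository.
assert pseudonymise("NHS-1234567") == safe_record["patient_id"]
```

A keyed hash (rather than a plain hash) matters because identifiers such as health-service numbers come from a small, guessable space; without the key an attacker could enumerate them.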

    The full set of c_n-invariant factorized S-matrices

    We use the method of the tensor product graph to construct rational (Yangian-invariant) solutions of the Yang-Baxter equation in fundamental representations of c_n and thence the full set of c_n-invariant factorized S-matrices. Brief comments are made on their bootstrap structure and on Belavin's scalar Yangian conserved charges. Comment: 10 pages.
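    For context, factorized S-matrices of this kind are solutions of the Yang-Baxter equation, which in its standard rational (spectral-parameter) form reads:

```latex
R_{12}(u)\, R_{13}(u+v)\, R_{23}(v) \;=\; R_{23}(v)\, R_{13}(u+v)\, R_{12}(u)
```

where R_{ij}(u) acts nontrivially on the i-th and j-th factors of a triple tensor product V ⊗ V ⊗ V and u, v are spectral parameters; consistency of multi-particle scattering reduces to this three-particle condition.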

    Diffraction-limited CCD imaging with faint reference stars

    By selecting short-exposure images taken using a CCD with negligible readout noise, we obtained essentially diffraction-limited 810 nm images of faint objects using nearby reference stars brighter than I=16 at a 2.56 m telescope. The FWHM of the isoplanatic patch for the technique is found to be 50 arcseconds, providing ~20% sky coverage around suitable reference stars. Comment: 4-page letter accepted for publication in Astronomy and Astrophysics.
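    The frame-selection ("lucky imaging") technique described here has two steps: rank short exposures by a sharpness metric measured on the reference star, then shift-and-add only the best fraction. The sketch below uses synthetic Gaussian point-source frames of our own devising as a stand-in for real CCD data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical short-exposure frames: one point source whose centroid
# wanders and whose blur varies frame to frame (toy model of seeing).
def make_frame(sharp):
    y, x = np.mgrid[0:32, 0:32]
    cy, cx = 16 + rng.normal(0, 2), 16 + rng.normal(0, 2)
    sigma = 1.0 if sharp else 3.0
    # Flux-normalized Gaussian: sharper frames have a higher peak.
    return np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2 * sigma**2)) / (2 * np.pi * sigma**2)

frames = [make_frame(sharp=(i % 4 == 0)) for i in range(100)]

# 1. Rank frames by a sharpness metric: peak flux of the reference star.
scores = [f.max() for f in frames]
best = sorted(range(len(frames)), key=lambda i: scores[i], reverse=True)[:25]

# 2. Shift-and-add: recentre each selected frame on its brightest pixel.
stack = np.zeros((32, 32))
for i in best:
    peak = np.unravel_index(frames[i].argmax(), frames[i].shape)
    stack += np.roll(np.roll(frames[i], 16 - peak[0], axis=0), 16 - peak[1], axis=1)
stack /= len(best)
```

Selecting only the sharpest frames trades total exposure time for resolution, which is why a bright enough reference star inside the isoplanatic patch is needed to do the ranking and recentring.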

    R-matrices and Tensor Product Graph Method

    A systematic method for constructing trigonometric R-matrices corresponding to the (multiplicity-free) tensor product of any two affinizable representations of a quantum algebra or superalgebra has been developed by the Brisbane group and its collaborators. This method has been referred to as the Tensor Product Graph Method. Here we describe applications of this method to untwisted and twisted quantum affine superalgebras. Comment: LaTeX, 7 pages. Contribution to the APCTP-Nankai Joint Symposium on "Lattice Statistics and Mathematical Physics", 8-10 October 2001, Tianjin, China.

    Error correcting code using tree-like multilayer perceptron

    An error correcting code using a tree-like multilayer perceptron is proposed. An original message s^0 is encoded into a codeword y_0 using a tree-like committee machine (committee tree) or a tree-like parity machine (parity tree). Based on these architectures, several schemes featuring monotonic or non-monotonic units are introduced. The codeword y_0 is then transmitted via a Binary Asymmetric Channel (BAC), where it is corrupted by noise. The analytical performance of these schemes is investigated using the replica method of statistical mechanics. Under some specific conditions, some of the proposed schemes are shown to saturate the Shannon bound in the infinite-codeword-length limit. The influence of the monotonicity of the units on the performance is also discussed. Comment: 23 pages, 3 figures. Content has been extended and revised.
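    The encoding idea can be sketched at toy size: each codeword bit is the output of a committee tree, i.e. a majority vote over hidden perceptrons that see disjoint parts of the message through fixed random weights known to the decoder. Everything below (sizes, flip probabilities, and the brute-force maximum-likelihood decoder) is our own illustrative choice; the paper instead analyses the large-system limit with the replica method. We use the committee tree rather than the parity tree here because the parity tree's output is invariant under flipping an even number of branches, which would make the toy decoder's answer non-unique:

```python
import numpy as np

rng = np.random.default_rng(3)
N, K, M = 12, 3, 240  # message bits, tree branches, codeword bits

s0 = rng.choice([-1, 1], size=N)        # original +/-1 message
W = rng.normal(size=(M, K, N // K))     # fixed random weights, known to decoder

def encode(s):
    """Committee tree: majority vote of K perceptrons on disjoint message parts."""
    parts = s.reshape(K, N // K)
    h = np.sign(np.einsum("mkn,kn->mk", W, parts))  # hidden unit signs, (M, K)
    return np.sign(h.sum(axis=1))                   # K odd, so never zero

y0 = encode(s0)

# Toy binary asymmetric channel: +1 flips with prob 0.08, -1 with prob 0.02.
flip = np.where(y0 > 0, rng.random(M) < 0.08, rng.random(M) < 0.02)
y = np.where(flip, -y0, y0)

# Brute-force maximum-likelihood decoding over all 2^N candidate messages
# (feasible only at this toy size).
best, best_score = None, -1
for idx in range(2**N):
    cand = np.array([1 if (idx >> b) & 1 else -1 for b in range(N)])
    score = int(np.sum(encode(cand) == y))
    if score > best_score:
        best, best_score = cand, score
```

With many more codeword bits than message bits (M >> N), the true message agrees with the received word on far more positions than any competitor, so the exhaustive decoder recovers it despite the channel noise.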

    Comprehensive cosmographic analysis by Markov Chain Method

    We study the possibility of extracting model-independent information about the dynamics of the universe by using Cosmography. We intend to explore it systematically, to learn about its limitations and its real possibilities, sticking here to the series expansion approach on which Cosmography is based. We apply it to different data sets: Type Ia Supernovae (SNeIa), Hubble parameter measurements extracted from differential galaxy ages, Gamma Ray Bursts (GRBs) and Baryon Acoustic Oscillations (BAO) data. We go beyond past results in the literature by extending the series expansion up to the fourth order in the scale factor, which implies the analysis of the deceleration, q_{0}, the jerk, j_{0}, and the snap, s_{0}. We use the Markov Chain Monte Carlo method (MCMC) to analyze the data statistically. We also try to relate direct results from Cosmography to dynamical dark energy (DE) models parameterized by the Chevallier-Polarski-Linder (CPL) model, extracting clues about the matter content and the dark energy parameters. The main results are: a) even if it relies on a mathematical approximation such as the series expansion of the scale factor in terms of time, Cosmography can be extremely useful in assessing the dynamical properties of the Universe; b) the deceleration parameter clearly confirms the present acceleration phase; c) the MCMC method can help give narrower constraints in parameter estimation, in particular for the higher-order cosmographic parameters (the jerk and the snap), with respect to the literature; d) both the estimated jerk and the DE parameters reflect the possibility of a deviation from the LCDM cosmological model. Comment: 24 pages, 7 figures.
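    The cosmographic parameters in this abstract are defined from successive derivatives of the expansion history. As a small concrete check (the fiducial model and Om = 0.3 are our assumptions, unrelated to the paper's data fits), one can recover the deceleration and jerk of flat LCDM numerically from H(z)/H0 and compare with the known analytic values q0 = 3*Om/2 - 1 and j0 = 1:

```python
import numpy as np

Om = 0.3   # hypothetical matter density for the illustration
h = 1e-4   # finite-difference step in redshift

def E(z):
    """Dimensionless expansion rate H(z)/H0 for flat LCDM."""
    return np.sqrt(Om * (1 + z) ** 3 + 1 - Om)

def q(z):
    """Deceleration parameter: q(z) = -1 + (1+z) dlnE/dz."""
    dlnE = (np.log(E(z + h)) - np.log(E(z - h))) / (2 * h)
    return -1 + (1 + z) * dlnE

q0 = q(0.0)
dq0 = (q(h) - q(-h)) / (2 * h)
j0 = q0 * (1 + 2 * q0) + dq0   # jerk: j = q(1 + 2q) + (1+z) dq/dz, at z = 0
```

A measured j0 significantly different from 1 would therefore signal a deviation from LCDM, which is why constraining the jerk (and the snap at the next order) is the point of pushing the expansion to fourth order.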

    Building dynamic capabilities through operations strategy: an empirical example

    This paper suggests that the implementation of an effective operations strategy process is one of the necessary antecedents to the development of dynamic capabilities within an organisation, and that once established, dynamic capabilities and the operations strategy process settle into a symbiotic relationship. Key terms and a model of operations strategy process are proposed from the literature as a framework for analysing data from a longitudinal case study with a UK-based manufacturer of construction materials.